Fast and provable tensor robust principal component analysis via scaled gradient descent

Abstract

An increasing number of data science and machine learning problems rely on computation with tensors, which better capture the multi-way relationships and interactions in data than matrices. When tapping into this critical advantage, a key challenge is to develop computationally efficient and provably correct algorithms for extracting useful information from tensor data that are simultaneously robust to corruptions and to ill-conditioning. This paper tackles tensor robust principal component analysis (RPCA), which aims to recover a low-rank tensor from its observations contaminated by sparse corruptions, under the Tucker decomposition. To minimize the computation and memory footprints, we propose to directly recover the low-dimensional tensor factors—starting from a tailored spectral initialization—via scaled gradient descent (ScaledGD), coupled with an iteration-varying thresholding operation to adaptively remove the impact of corruptions. Theoretically, we establish that the proposed algorithm converges linearly to the true low-rank tensor at a constant rate that is independent of its condition number, as long as the level of corruptions is not too large. Empirically, we demonstrate that the proposed algorithm achieves better and more scalable performance than state-of-the-art tensor RPCA algorithms through synthetic experiments and real-world applications.
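To make the recipe in the abstract concrete, below is a minimal numpy sketch of a ScaledGD-style loop for 3-way tensor RPCA under a Tucker factorization. It illustrates the general idea rather than reproducing the paper's algorithm: the step size eta, the geometric threshold schedule (zeta shrinking by a fixed decay factor), and the plain-SVD spectral initialization are placeholder assumptions, and all function names are my own.

```python
import numpy as np

def mode_product(T, M, k):
    """Mode-k product T x_k M: multiply M into the k-th mode of tensor T."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, k, 0), axes=(1, 0)), 0, k)

def unfold(T, k):
    """Mode-k unfolding: rows indexed by mode k, columns by the remaining modes."""
    return np.moveaxis(T, k, 0).reshape(T.shape[k], -1)

def tucker(G, Us):
    """Reconstruct X = G x_1 U_1 x_2 U_2 x_3 U_3 from core G and factors Us."""
    X = G
    for k, U in enumerate(Us):
        X = mode_product(X, U, k)
    return X

def tensor_rpca_scaledgd(Y, ranks, eta=0.3, zeta0=None, decay=0.95, n_iters=100):
    """Illustrative ScaledGD-style tensor RPCA: Y = low-Tucker-rank + sparse.

    ranks is the target multilinear rank (r1, r2, r3). The parameters here
    are placeholders, not the tuned choices from the paper.
    """
    # Spectral initialization (simplified): leading subspaces of each unfolding.
    Us = []
    for k in range(3):
        U, _, _ = np.linalg.svd(unfold(Y, k), full_matrices=False)
        Us.append(U[:, :ranks[k]])
    G = Y
    for k in range(3):
        G = mode_product(G, Us[k].T, k)        # core from projected observations

    zeta = np.abs(Y).max() if zeta0 is None else zeta0
    for _ in range(n_iters):
        X = tucker(G, Us)
        # Iteration-varying thresholding: keep only large residual entries as
        # the current sparse-corruption estimate, shrinking zeta each round.
        S = (Y - X) * (np.abs(Y - X) > zeta)
        R = X + S - Y                          # residual of the current fit
        new_Us = []
        for k in range(3):
            # W = core times the other two factors, so that X_(k) = U_k @ unfold(W, k).
            W = G
            for j in range(3):
                if j != k:
                    W = mode_product(W, Us[j], j)
            Wk = unfold(W, k)
            # Scaled gradient step: right-preconditioning by (Wk Wk^T)^{-1} is
            # what makes the progress insensitive to the condition number.
            new_Us.append(Us[k] - eta * unfold(R, k) @ Wk.T @ np.linalg.inv(Wk @ Wk.T))
        # Scaled update of the core: push the residual through the
        # pseudo-inverses of the factors before stepping.
        P = R
        for k in range(3):
            P = mode_product(P, np.linalg.inv(Us[k].T @ Us[k]) @ Us[k].T, k)
        G, Us, zeta = G - eta * P, new_Us, zeta * decay
    return tucker(G, Us)
```

A quick synthetic check of this sketch would generate a random low-Tucker-rank tensor, add sparse outliers to a small fraction of entries, and verify that the output approaches the clean tensor as zeta decays.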

Similar Articles

Tensor principal component analysis via convex optimization

This paper is concerned with the computation of the principal components for a general tensor, known as the tensor principal component analysis (PCA) problem. We show that the general tensor PCA problem is reducible to its special case where the tensor in question is supersymmetric with an even degree. In that case, the tensor can be embedded into a symmetric matrix. We prove that if the tensor...
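As a hedged illustration of the embedding mentioned above, the following uses the standard square matricization of a fourth-order super-symmetric tensor; it is an assumption about the kind of construction meant, not a quotation of that paper's exact one. Indexing rows by the pair (i, j) and columns by (k, l),

```latex
\[
  M(F)_{(i,j),(k,l)} = F_{ijkl}, \qquad
  \langle F,\, x^{\otimes 4} \rangle
  = (x \otimes x)^{\top} M(F)\,(x \otimes x),
\]
```

so M(F) is a symmetric matrix of size n^2 by n^2, and the rank-one tensor objective becomes a quadratic form in x ⊗ x that can be handled with matrix (convex) machinery.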

Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent

We propose a new two-stage algorithm LING for large-scale regression problems. LING has the same risk as the well-known Ridge Regression under the fixed design setting and can be computed much faster. Our experiments have shown that LING performs well in terms of both prediction accuracy and computational efficiency compared with other large-scale regression algorithms like Gradient Descent, St...

FRPCA: Fast Robust Principal Component Analysis

While the performance of Robust Principal Component Analysis (RPCA), in terms of the recovered low-rank matrices, is quite satisfactory for many applications, the time efficiency is not, especially for large-scale data. We propose to solve this problem using a novel fast incremental RPCA (FRPCA) approach. The low-rank matrices of the incrementally-observed data are estimated using a convex optimiza...

Fast Algorithms for Robust PCA via Gradient Descent

We consider the problem of Robust PCA in the fully and partially observed settings. Without corruptions, this is the well-known matrix completion problem. From a statistical standpoint this problem has been recently well-studied, and conditions on when recovery is possible (how many observations do we need, how many corruptions can we tolerate) via polynomial-time algorithms are by now unders...

Scaled Gradient Descent Learning Rate

Adaptive behaviour through machine learning is challenging in many real-world applications such as robotics. This is because learning has to be rapid enough to be performed in real time and to avoid damage to the robot. Models using linear function approximation are interesting in such tasks because they offer rapid learning and have small memory and processing requirements. Adalines are a simp...


Journal

Journal title: Information and Inference: A Journal of the IMA

Year: 2023

ISSN: 2049-8772, 2049-8764

DOI: https://doi.org/10.1093/imaiai/iaad019